How “Algospeak” Is Rewriting What You Can—and Can’t—Say Online
The Secret Grammar of Social Media
Ever noticed people online tiptoeing around certain words—saying “unalived” instead of “killed,” calling guns “pew‑pews,” or referring to sex as “seggs”? What might sound like childish slang is, in fact, a coded dialect emerging across social media. According to a recent BBC piece, this isn’t just meme culture—it’s a survival strategy. Creators say they’re speaking a secret language known as algospeak, used to evade automated moderation while still discussing serious, sensitive topics.
Behind the Code: Why People Are Rewriting Their Speech
At its core, algospeak is rooted in the belief that algorithms — not human content moderators — determine which content rises to the surface and which gets buried. Users fear that certain “trigger” words may lead to demonetization, suppression, or shadow banning. So they adapt, inventing euphemisms that may sound ridiculous but help their posts stay visible.
Tech companies like YouTube, Meta, and TikTok claim they don’t maintain fixed “banned‑words” lists. Instead, they say moderation decisions are contextual. Still, many creators feel there’s more going on beneath the surface. (The Star)
This perception of algorithmic censorship has sparked self-censorship: creators avoid “dangerous” words altogether or resort to funny, coded versions just to stay visible. (The Star)
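The dynamic creators describe can be made concrete with a toy model. The sketch below is purely illustrative, not any platform's actual system: it assumes a fixed "banned-words" list and exact keyword matching (the very mechanism platforms say they don't use), and shows why a euphemism like "unalived" would trivially slip past such a filter.

```python
# Hypothetical sketch of a naive keyword filter.
# The word list and logic are illustrative assumptions, not a real platform's moderation system.

BLOCKED_WORDS = {"killed", "gun", "sex"}  # toy fixed "banned-words" list

def naive_filter(post: str) -> bool:
    """Return True if the post would be flagged by exact keyword matching."""
    words = (w.strip(".,!?") for w in post.lower().split())
    return any(w in BLOCKED_WORDS for w in words)

print(naive_filter("He was killed at the protest"))    # flagged
print(naive_filter("He was unalived at the protest"))  # the euphemism slips through
```

If moderation really worked this way, coded substitutes would defeat it completely, which is exactly why platforms insist their decisions are contextual rather than list-driven.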
Real-World Examples: When “Music Festival” Isn’t About Music
Consider TikTok science communicator Alex Pearlman. He says he avoids even saying “YouTube” in his TikToks, fearing that the algorithm punishes him for sending viewers to a competing platform. (The Star)
Then there’s a more provocative case: in August 2025, amid protests over U.S. immigration raids, content creators began calling the demonstrations a “music festival” — a seemingly absurd euphemism. Videos using this phrasing went viral, and some observers believed that undisguised coverage of the protests was being suppressed. (The Star) Linguist Adam Aleksic, who wrote Algospeak: How Social Media Is Transforming the Future of Language, says the “music festival” phenomenon is a classic case of unintentional code language. (Wikipedia) Ironically, using a coded term created more engagement — which then reinforced the myth of algorithmic censorship. (The Star)
Researchers call this the “algorithmic imaginary”: the idea that people’s beliefs about how algorithms work can shape how they behave, even if the company’s actual system doesn’t operate in that way. (The Star)
But Is It All Self‑Inflicted?
Algospeak isn’t just a quirk — some argue it’s a form of digital survival. Others call it linguistic activism or a subversive art. For marginalized or politically engaged creators, euphemizing can be the only way to speak openly without being immediately suppressed.
Still, it’s more complicated than that. Experts warn that this coded way of speaking can obscure serious topics — from violence to mental health — by turning them into jargon. (The Star) For example, using “unalived” instead of “suicide” may help avoid moderation, but it also risks distancing people from the gravity of what they’re really talking about.
On the other hand, tech companies argue moderation is not as draconian as people think. They maintain that context matters — a word used in the wrong way, or with harmful intent, is more likely to be flagged. (The Star)
What Drives This Veiled Conversation?
- Profit motive: Platforms profit from engagement. They also want to appeal to advertisers, which may discourage them from promoting content with explicitly sensitive topics. (The Star)
- Opacity: Algorithms are black boxes. Creators don’t always know why a post fails — was it the topic, the phrasing, or pure chance? (The Star)
- Cultural change: Algospeak isn’t static — it evolves. What starts as a tweet or TikTok slang can make its way into mainstream conversation. (Wikipedia)
- Identity & protest: For some, using code is a way to reclaim expression. It’s not only about staying visible—it’s about resisting digital control.
Why It Matters
This isn’t just linguistic curiosity. The rise of algospeak reveals how power, money, and censorship shape not just what content exists online — but how we talk. When people adapt their vocabulary to appease algorithms, some voices get amplified, others muffled.
The bigger question: do we accept a world where “unalive” is the safest way to talk about death, or do we demand a system that supports real language — even when it’s uncomfortable?
Glossary
- Algospeak: A coded dialect used on social media to evade algorithmic moderation — e.g., “unalive” for “killed.” (Wikipedia)
- Algorithmic imaginary: The belief, real or imagined, about how algorithms moderate content; influences how users create content. (The Star)
- Self-censorship: When creators change how they speak or write because they fear being penalized by platform moderation.
Source: The Star’s summary of the BBC News article “The words you can’t say on the internet.” (The Star)